Demystifying ResNet
Authors
Abstract
The Residual Network (ResNet), proposed by He et al. (2015a), uses shortcut connections to significantly reduce the difficulty of training, yielding large gains in both training and generalization error. He et al. (2015a) empirically observed that stacking more residual blocks with depth-2 shortcuts results in smaller training error, while the same is not true for shortcuts of depth 1 or 3. We provide a theoretical explanation for the uniqueness of depth-2 shortcuts. We show that, with or without nonlinearities, adding shortcuts of depth 2 makes the condition number of the Hessian of the loss function at the zero initial point depth-invariant, so that training very deep models is no more difficult than training shallow ones. Shortcuts of greater depth instead produce an extremely flat (high-order) stationary point at initialization, from which the optimization algorithm struggles to escape. Depth-1 shortcuts, however, are essentially equivalent to having no shortcuts at all, and the condition number explodes to infinity as the number of layers grows. We further argue that as the number of layers tends to infinity, it suffices to study the loss function at the zero initial point. Extensive experiments accompany our theoretical results. We show that initializing a network with depth-2 shortcuts to small weights achieves significantly better results than random Gaussian (Xavier) initialization, orthogonal initialization, and deeper shortcuts, from perspectives ranging from final loss and learning dynamics and stability to the behavior of the Hessian over the course of training.
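To make the construction concrete, here is a minimal PyTorch sketch (ours, not from the paper; the fully connected block shape, the width, and the init_std value are illustrative assumptions). With weights near zero, every depth-2 block is close to the identity map, so a stack of any depth starts near the zero initial point analyzed above:

    import torch
    import torch.nn as nn

    class ResidualBlock(nn.Module):
        """Depth-2 shortcut: y = x + W2 @ relu(W1 @ x)."""
        def __init__(self, dim, init_std=1e-3):
            super().__init__()
            self.fc1 = nn.Linear(dim, dim, bias=False)
            self.fc2 = nn.Linear(dim, dim, bias=False)
            # Small-weight initialization (an assumed stand-in for the
            # paper's scheme): each block starts near the identity map,
            # i.e. near the zero initial point discussed above.
            nn.init.normal_(self.fc1.weight, std=init_std)
            nn.init.normal_(self.fc2.weight, std=init_std)

        def forward(self, x):
            return x + self.fc2(torch.relu(self.fc1(x)))

    # Near zero, the whole stack is close to the identity, so increasing
    # the depth does not change the network's initial behavior.
    net = nn.Sequential(*[ResidualBlock(64) for _ in range(50)])
    x = torch.randn(8, 64)
    print((net(x) - x).abs().max())  # tiny deviation from the identity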
Similar Resources
Demystifying Deep Learning: A Geometric Approach to Iterative Projections
Parametric approaches to learning, such as deep learning (DL), are highly popular in nonlinear regression, despite the extreme difficulty of training them as their complexity grows (e.g., the number of layers in DL). In this paper, we present an alternative semi-parametric framework that foregoes the ordinarily required feedback by introducing the novel idea of geometric regularization. We s...
Supplementary Material for “DualNet: Learn Complementary Features for Image Recognition”
Besides ResNet-20, we further evaluate DualNet on deeper ResNets [6], e.g., with 32 and 56 layers (denoted ResNet-32 and ResNet-56, referring to the third-party implementation available at [2]). ResNet-32 and ResNet-56, as well as the corresponding DualNets (denoted DNR32 and DNR56), are also trained on the augmented CIFAR-100, and the experimental results are shown in Table 1. The perfo...
Resnet in Resnet: Generalizing Residual Architectures
ResNets have recently achieved state-of-the-art results on challenging computer vision tasks. In this paper, we create a novel architecture that improves ResNets by adding the ability to forget and by making the residuals more expressive, yielding excellent results. ResNet in ResNet outperforms architectures with similar amounts of augmentation on CIFAR-10 and establishes a new state-of-the-art...
Learning Deep Resnet Blocks Sequentially
We prove a multiclass boosting theory for ResNet architectures that simultaneously creates a new technique for multiclass boosting and provides a new algorithm for ResNet-style architectures. Our proposed training algorithm, BoostResNet, is particularly suitable for non-differentiable architectures. Our method only requires the relatively inexpensive sequential training of T “shallow ResNet...
Learning Deep ResNet Blocks Sequentially using Boosting Theory
Deep neural networks are known to be difficult to train due to the instability of back-propagation. A deep residual network (ResNet) with identity loops remedies this by stabilizing gradient computations. We prove a boosting theory for the ResNet architecture. We construct T weak module classifiers, each of which contains two of the T layers, such that the combined strong learner is a ResNet. Therefore,...
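The sequential idea can be illustrated with a short, hypothetical PyTorch sketch (not the authors' BoostResNet algorithm, which additionally relies on a boosting-style weighting of the weak module classifiers; the toy data, block shape, and training loop are our assumptions). Each residual block is trained on the frozen output of its predecessors, so gradients never propagate through more than one block:

    import torch
    import torch.nn as nn

    def make_block(dim):
        # One two-layer residual branch, mirroring the "two of the T layers"
        # that each weak module classifier contains.
        return nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    dim, num_blocks = 32, 4
    head = nn.Linear(dim, 10)               # toy 10-class problem
    x = torch.randn(256, dim)
    y = torch.randint(0, 10, (256,))

    features = x
    for t in range(num_blocks):
        block = make_block(dim)
        opt = torch.optim.Adam(list(block.parameters()) + list(head.parameters()))
        for _ in range(100):                 # train only this block and the head
            loss = nn.functional.cross_entropy(head(features + block(features)), y)
            opt.zero_grad(); loss.backward(); opt.step()
        with torch.no_grad():                # freeze: output feeds the next stage
            features = features + block(features)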
Journal: CoRR
Volume: abs/1611.01186
Publication date: 2016